Mutual information formalism

Authors

  • Michael Berens
  • Zheng Zhao
Abstract

As a theoretical basis of mRMR feature selection, we consider a more general feature-selection criterion, maximum dependency (MaxDep) [1]. In this case, we select the feature set Sm = {f1, f2, ..., fm} whose joint statistical distribution is maximally dependent on the distribution of the classification variable c. A convenient way to measure this statistical dependency is mutual information.
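
The dependency measure invoked here is presumably the standard multivariate mutual information between the joint feature vector and the class variable; under that reading (a sketch, not a formula quoted from the article), the MaxDep criterion reads

    \max_{S_m} I(S_m; c), \qquad I(S_m; c) = \iint p(S_m, c) \,\log \frac{p(S_m, c)}{p(S_m)\, p(c)} \, dS_m \, dc,

where p(S_m, c) is the joint density of the m selected features and the class, and p(S_m), p(c) are its marginals.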

Similar resources

A Possible Extension of Shannon's Information Theory

As a possible generalization of Shannon's information theory, we review the formalism based on the non-logarithmic information content parametrized by a real number q, which exhibits nonadditivity of the associated uncertainty. Moreover, it is shown that establishing the concept of mutual information is of particular importance in this generalization.
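
The q-parametrized, non-logarithmic information content described here appears to be of the Tsallis type; assuming that reading (the abstract itself does not name the functional), a minimal sketch of the entropy and its nonadditivity is

    S_q = \frac{1 - \sum_i p_i^q}{q - 1}, \qquad \lim_{q \to 1} S_q = -\sum_i p_i \ln p_i,

    S_q(A, B) = S_q(A) + S_q(B) + (1 - q)\, S_q(A)\, S_q(B) \quad \text{for independent } A, B,

which recovers ordinary additivity only at q = 1.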

Mutual Information Expansion for Studying the Role of Correlations in Population Codes: How Important Are Autocorrelations?

The role of correlations in the activity of neural populations responding to a set of stimuli can be studied within an information theory framework. Regardless of whether one approaches the problem from an encoding or decoding perspective, the main measures used to study the role of correlations can be derived from a common source: the expansion of the mutual information. Two main formalisms of...

Languages of Quantum Information Theory

This note will introduce some notation and definitions for information-theoretic quantities in the context of quantum systems, such as (conditional) entropy and (conditional) mutual information. We will employ the natural C*-algebra formalism, and it turns out that one has an all-over dualism of language: we can define everything for (compatible) observables, but also for (compatible) C*-subalgeb...
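
For orientation, the density-operator forms of the quantities named here are standard (the note's observable/C*-subalgebra definitions are assumed to reduce to these in the finite-dimensional case): with von Neumann entropy S(\rho) = -\mathrm{Tr}\,\rho \log \rho and a bipartite state \rho_{AB},

    S(A|B) = S(\rho_{AB}) - S(\rho_B), \qquad I(A:B) = S(\rho_A) + S(\rho_B) - S(\rho_{AB}).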

On Classification of Bivariate Distributions Based on Mutual Information

Among all measures of independence between random variables, mutual information is the only one that is based on information theory. Mutual information takes into account all kinds of dependencies between variables, i.e., both linear and non-linear dependencies. In this paper we have classified some well-known bivariate distributions into two classes of distributions based on their mutua...
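
As a concrete instance of the measure such a classification rests on (not taken from the paper itself), the mutual information of a bivariate density f(x, y) with marginals f_X, f_Y, and its closed form for the bivariate normal with correlation \rho, are

    I(X; Y) = \iint f(x, y) \,\log \frac{f(x, y)}{f_X(x)\, f_Y(y)} \, dx \, dy, \qquad I_{\text{bivariate normal}} = -\tfrac{1}{2} \log\!\left(1 - \rho^2\right),

and I(X; Y) vanishes exactly when X and Y are independent.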

Multi-modal diffeomorphic registration using mutual information: Application to the registration of CT and MR pulmonary images

In this paper, we present a new algorithm to register multimodal images using mutual information in a fully diffeomorphic framework. Our driving motivation is to define a one-to-one mapping in CT/MR 3D pulmonary images acquired from patients with empyema. Due to the large amount of respiratory motion and the presence of strong pathologies, preserving the invertibility of the deformations can be...
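
A common way to phrase such an approach (a sketch under that assumption, not necessarily the authors' exact formulation) is to seek a diffeomorphism \varphi maximizing the mutual information between the fixed image F and the warped moving image M \circ \varphi, possibly with a regularization term R weighted by \lambda:

    \varphi^{*} = \arg\max_{\varphi \in \mathrm{Diff}} \; I\big(F,\, M \circ \varphi\big) - \lambda\, R(\varphi).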

Journal:

Volume   Issue

Pages  -

Publication date: 2005